Memory error while training data set for nltk classifiers

by: Lakshay Sharma, 9 years ago


Hi,
I am using a system with 4 GB RAM and an i5-3320, and I am getting a memory error when training classifiers on more than 1000 words. Is there any way to rectify the error? Please help.






Are you making use of all 4 GB? Are you running 32-bit or 64-bit Python? The default download is 32-bit, regardless of your operating system. If you're using 32-bit Python, then the memory limit is 2 GB. Alternatively, you can download the trained files from https://github.com/PythonProgramming/NLTK-3----Natural-Language-Processing-with-Python-series
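If you're not sure which build you have, here's a quick stdlib-only check you can run (just a sketch; the pointer-size trick reports the interpreter's bitness, not your OS's):

```python
import struct
import sys

# Pointer size in bytes, times 8, gives the interpreter's bitness:
# 32 on a 32-bit Python build, 64 on a 64-bit build.
bits = struct.calcsize("P") * 8
print(f"{bits}-bit Python")

# Cross-check: sys.maxsize only exceeds 2**32 on a 64-bit build.
assert (bits == 64) == (sys.maxsize > 2**32)
```

If it prints 32-bit, installing a 64-bit Python build should let training use the full 4 GB.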

-Harrison 9 years ago



Yeah, I am using 32-bit Python, so that must be the cause. Thanks, appreciate it.

-Lakshay Sharma 9 years ago
